
    Scene camera movies from mobile eye tracker

    These are the full recorded scene camera videos. Eye position data for each scene can also be found here, along with an Excel file detailing which parts of the clips we used.

    High temporal frequency adaptation compresses time in the Flash-Lag illusion

    Abstract
    Previous research finds that 20Hz temporal frequency (TF) adaptation causes a compression of perceived visual event duration. We investigate whether this temporal compression affects other time-dependent percepts, which would imply a broader functional role for duration perception mechanisms. We measure the effect of 20Hz flicker adaptation on the Flash-Lag effect, an illusion in which an observer perceives a moving object displaced further along its trajectory relative to a spatially localized, briefly flashed object. The illusion scales with object speed and therefore has a fixed temporal component. By comparing adaptation at 5Hz and 20Hz, we show that 20Hz TF adaptation significantly reduces perceived Flash-Lag magnitude, with no effect at 5Hz, whereas the opposite pattern of adaptation effects was seen for perceived speed. There is also a significant effect of 20Hz adaptation on the perceived duration of a moving bar. This suggests that 20Hz TF adaptation compressed the fixed temporal component of the Flash-Lag illusion, implying that the mechanism underlying duration perception also affects judgements of spatial relationships in dynamic stimuli.
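    The speed-scaling argument above can be illustrated with a minimal sketch (the numbers are hypothetical, not from the study): because the spatial lag grows in proportion to object speed, the implied temporal offset, lag divided by speed, stays constant.

    ```python
    # Flash-Lag displacement scales linearly with object speed,
    # so the implied temporal offset dt = displacement / speed is constant.
    def implied_temporal_offset(displacement_deg: float, speed_deg_per_s: float) -> float:
        """Return the temporal component (seconds) implied by a spatial lag."""
        return displacement_deg / speed_deg_per_s

    # Hypothetical illustration: a 0.8 deg lag at 10 deg/s and a 1.6 deg lag
    # at 20 deg/s both imply the same ~80 ms temporal component.
    dt_slow = implied_temporal_offset(0.8, 10.0)
    dt_fast = implied_temporal_offset(1.6, 20.0)
    print(dt_slow, dt_fast)  # both approximately 0.08 s
    ```

    On this account, a manipulation that compresses the fixed temporal component (here, 20Hz adaptation) should shrink the spatial lag at every speed.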

    The combined effect of eye movements improve head centred local motion information during walking.

    Eye movements play multiple roles in human behaviour: small stabilizing movements are important for keeping the image of the scene steady during locomotion, whilst large scanning movements search for relevant information. It has been proposed that eye-movement-induced retinal motion interferes with the estimation of self-motion based on optic flow. We investigated the effect of eye movements on retinal motion information during walking. Observers walked towards a target, wearing eye tracking glasses that simultaneously recorded the scene ahead and tracked the movements of both eyes. By realigning the frames of the recording from the scene ahead relative to the centre of gaze, we could mimic the input received by the retina (retinocentric coordinates) and compare this to the input received by the scene camera (head-centred coordinates). We asked which of these coordinate frames resulted in the least noisy motion information. Motion noise was calculated by finding the error between the optic flow signal and a noise-free motion expansion pattern. We found that eye movements improved the optic flow information available, even when large diversions away from the target were made.
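    The frame-realignment step described above can be sketched as a simple image translation: shift each scene-camera frame so the tracked gaze point sits at the image centre. This is a minimal illustration under assumed conventions (gaze given in frame pixel coordinates, missing regions zero-padded), not the authors' actual pipeline.

    ```python
    import numpy as np

    def to_retinocentric(frame: np.ndarray, gaze_xy: tuple) -> np.ndarray:
        """Re-centre a head-centred (scene camera) frame on the gaze point.

        frame   : H x W image in head-centred coordinates
        gaze_xy : (x, y) gaze position in frame pixel coordinates
        Returns a same-sized image whose centre is the gaze point,
        zero-padded where the shifted frame has no data.
        """
        h, w = frame.shape[:2]
        dy = h // 2 - gaze_xy[1]   # vertical shift bringing gaze to centre
        dx = w // 2 - gaze_xy[0]   # horizontal shift bringing gaze to centre
        out = np.zeros_like(frame)
        # Overlapping source/destination windows after shifting by (dy, dx)
        ys, yd = max(0, -dy), max(0, dy)
        xs, xd = max(0, -dx), max(0, dx)
        hh, ww = h - abs(dy), w - abs(dx)
        if hh > 0 and ww > 0:
            out[yd:yd + hh, xd:xd + ww] = frame[ys:ys + hh, xs:xs + ww]
        return out

    # Hypothetical example: gaze at pixel (x=3, y=1) in a tiny 5x5 frame;
    # the gaze pixel ends up at the image centre (2, 2).
    frame = np.arange(25).reshape(5, 5)
    centred = to_retinocentric(frame, (3, 1))
    ```

    Applying this per frame yields the retinocentric video stream, which can then be compared against the raw head-centred stream by computing optic flow on each and measuring the error against a noise-free expansion pattern.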